Local Factor Analysis with Automatic Model Selection and Data Smoothing Based Regularization

Author

  • Lei Shi
Abstract

Local factor analysis (LFA) is regarded as an efficient approach that implements local feature extraction and dimensionality reduction. A further investigation is made into an automatic BYY harmony data smoothing LFA (LFA-HDS) from the Bayesian Ying-Yang (BYY) harmony learning point of view. On the level of regularization, a data-smoothing-based regularization technique is adapted into this automatic LFA-HDS learning for problems with small sample sizes, while on the level of model selection, the proposed automatic LFA-HDS algorithm performs parameter learning with automatic determination of both the component number and the factor number in each component. A comparative study has been conducted on simulated data sets and several real-world data sets. The algorithm has been compared not only with a recent approach called Incremental Mixture of Factor Analysers (IMoFA) but also with the conventional two-stage implementation of maximum likelihood (ML) plus model selection, namely, using the EM algorithm for parameter learning on a series of candidate models and selecting the best candidate by AIC, CAIC, BIC, or cross-validation (CV). Experiments have shown that IMoFA, ML-BIC, and ML-CV outperform ML-AIC and ML-CAIC. Interestingly, the data smoothing BYY harmony learning obtains results comparable to those of IMoFA and ML-BIC but at much lower computational cost.
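The conventional two-stage baseline in the comparison (maximum-likelihood fitting on a grid of candidate models, then selection by an information criterion) can be sketched for the simplified single-component case, choosing only the factor number by BIC. This is a minimal illustration, not the paper's method: scikit-learn's `FactorAnalysis` stands in for the full mixture of factor analyzers, and the parameter count is a simplified approximation.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

def bic_factor_selection(X, max_factors):
    """Two-stage ML + BIC: fit each candidate factor number, keep the lowest BIC."""
    n, d = X.shape
    best_k, best_bic = None, np.inf
    for k in range(1, max_factors + 1):
        fa = FactorAnalysis(n_components=k, random_state=0).fit(X)
        total_ll = fa.score(X) * n   # score() returns the mean log-likelihood per sample
        n_params = d * k + d         # simplified count: loadings + diagonal noise variances
        bic = -2.0 * total_ll + n_params * np.log(n)
        if bic < best_bic:
            best_k, best_bic = k, bic
    return best_k

# Synthetic data with a known 3-factor structure in 10 dimensions
rng = np.random.default_rng(0)
Z = rng.normal(size=(500, 3))                  # latent factors
W = rng.normal(size=(3, 10))                   # loading matrix
X = Z @ W + 0.1 * rng.normal(size=(500, 10))   # observations with weak isotropic noise
print(bic_factor_selection(X, max_factors=6))
```

In the full mixture setting the grid runs over both the component number and the per-component factor numbers, which is exactly the combinatorial cost that the automatic LFA-HDS algorithm avoids by selecting both during a single parameter-learning run.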


Related Articles

Automatic Smoothing and Variable Selection via Regularization

This thesis focuses on developing computational methods and the general theory of automatic smoothing and variable selection via regularization. Methods of regularization are a commonly used technique for obtaining stable solutions to ill-posed problems such as nonparametric regression and classification. In recent years, methods of regularization have also been successfully introduced to address a cla...


A Trend on Regularization and Model Selection in Statistical Learning: A Bayesian Ying Yang Learning Perspective

In this chapter, advances on regularization and model selection in statistical learning have been summarized, and a trend has been discussed from a Bayesian Ying-Yang learning perspective. After briefly introducing the Bayesian Ying-Yang system and best harmony learning, not only its advantages of automatic model selection and of integrating regularization and model selection have been addressed, bu...


Some Problems in Model Selection 1

This dissertation consists of three parts: the first two parts are related to smoothing spline ANOVA models; the third part concerns the Lasso and its related procedures in model selection. In Part I, by adopting the Cox proportional hazard model to quantify the hazard function, we propose a novel nonparametric model selection technique to analyze time to event data, within the framework of smo...


Automatic model selection for partially linear models

We propose and study a unified procedure for variable selection in partially linear models. A new type of double-penalized least squares is formulated, using the smoothing spline to estimate the nonparametric part and applying a shrinkage penalty on parametric components to achieve model parsimony. Theoretically we show that, with proper choices of the smoothing and regularization parameters, t...


Studies of model selection and regularization for generalization in neural networks with applications

This thesis investigates the generalization problem in artificial neural networks, attacking it from two major approaches: regularization and model selection. On the regularization side, under the framework of Kullback–Leibler divergence for feedforward neural networks, we develop a new formula for the regularization parameter in Gaussian density kernel estimation based on available training da...



Journal:

Volume:   Issue:

Pages:  -

Publication date: 2006